
    End-to-End Privacy for Open Big Data Markets

    The idea of an open data market envisions the creation of a data trading model that facilitates the exchange of data between different parties in the Internet of Things (IoT) domain. The data collected by IoT products and solutions are expected to be traded in these markets. Data owners will collect data using IoT products and solutions, and interested data consumers will negotiate with the data owners to gain access to that data. Data captured by IoT products will allow data consumers to better understand the preferences and behaviours of data owners and to generate additional business value using techniques ranging from waste reduction to personalized service offerings. In open data markets, data consumers will be able to return part of the additional value generated to the data owners. However, privacy becomes a significant issue when data that can be used to derive extremely personal information is traded. This paper discusses why privacy matters in the IoT domain in general, and in open data markets in particular, and surveys existing privacy-preserving strategies and design techniques that can be used to facilitate end-to-end privacy for open data markets. We also highlight some of the major research challenges that need to be addressed to make the vision of open data markets a reality while ensuring the privacy of stakeholders.
    Comment: Accepted to be published in IEEE Cloud Computing Magazine: Special Issue Cloud Computing and the La
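    As one illustration of the kind of privacy-preserving design technique such a survey covers (a generic sketch of local differential privacy, not the paper's own method; the sensor reading, sensitivity and epsilon below are assumed, illustrative values), a data owner could perturb a reading before releasing it to a data consumer:

        # Sketch: local differential privacy for an IoT reading before it is traded.
        # All names and parameters here are illustrative assumptions, not from the paper.
        import math
        import random

        def laplace_noise(scale: float) -> float:
            """Draw Laplace(0, scale) noise via inverse-CDF sampling."""
            u = random.random() - 0.5
            return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

        def privatize_reading(value: float, sensitivity: float, epsilon: float) -> float:
            """Add Laplace noise so the released value satisfies epsilon-differential privacy."""
            return value + laplace_noise(sensitivity / epsilon)

        # Example: a smart-meter reading (kWh) perturbed before it is offered on the market.
        raw_kwh = 3.2
        released = privatize_reading(raw_kwh, sensitivity=0.5, epsilon=1.0)
        print(f"raw={raw_kwh:.2f} kWh, released={released:.2f} kWh")

    Smaller epsilon values add more noise and give the data owner stronger privacy at the cost of less useful data for the consumer.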

    Health-Promoting Food Ingredients and Functional Food Processing

    Investigation into Impact of Ageing on Rubber Component in Used Engine Mount

    A 2014 KPMG customer survey report demonstrated the increasing demand for driving comfort and sustainable development. With the longer lifespan of modern vehicles, more attention has been placed on products’ lifetime performance. Ageing of rubber components in the engine mount is known to be one of the key factors behind the compromised driving experience in used vehicles. This thesis investigates how the properties of the rubber component change, and why. Links among the mechanical properties, microstructure and chemical composition of the aged carbon-black-filled vulcanised natural rubber used in a commercial engine mount are to be revealed. By investigating used engine mounts, the change in stiffness of the rubber was established and identified to be related to post-curing, thermal degradation, oxidative degradation, filler re-agglomeration and loss of additives. Among these ageing mechanisms, the dominant factors were post-curing and loss of additives, which increased the stiffness of the rubber by 45% in a four-year-old car that had been driven 80 thousand kilometres. The impact of the acting ageing mechanisms was identified through aerobic and anaerobic artificial ageing experiments. The artificial ageing experiments provided knowledge about how each ageing mechanism progresses in the material and how they interact with each other. They also demonstrated the limitations of artificial ageing in simulating certain ageing mechanisms. This is the first time such a comprehensive investigation has been made to identify the causes of different ageing mechanisms on specimens from real vehicles and to discuss how each ageing mechanism individually affects the material. It is hoped that this work will provide useful information for industry and for other researchers in the area when designing rubber-based products or investigating the ageing behaviour of similar materials.
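    For context on how artificial ageing experiments of this kind are commonly mapped back to service conditions, the sketch below computes an Arrhenius acceleration factor; this is a standard convention rather than the thesis's own procedure, and the activation energy and temperatures are assumed values:

        # Sketch: Arrhenius acceleration factor for thermo-oxidative ageing of rubber.
        # The activation energy and temperatures are assumed, illustrative values;
        # the thesis itself may use different conditions.
        import math

        R = 8.314  # universal gas constant, J/(mol*K)

        def acceleration_factor(ea_j_per_mol: float, t_service_c: float, t_test_c: float) -> float:
            """Ratio of ageing rates at the elevated test temperature vs. the service temperature."""
            t_service = t_service_c + 273.15
            t_test = t_test_c + 273.15
            return math.exp(ea_j_per_mol / R * (1.0 / t_service - 1.0 / t_test))

        # Example: assumed Ea = 90 kJ/mol, 23 C service temperature, 80 C oven ageing.
        af = acceleration_factor(90e3, 23.0, 80.0)
        print(f"1 day in the oven ~ {af:.0f} days in service")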

    Virtual Environments for multiphysics code validation on Computing Grids

    In this paper we advocate the use of grid-based infrastructures designed to give numerical expert users, i.e., the designers of multiphysics applications, seamless access to computing resources. The approach relies on sophisticated computing environments built on computing grids that connect heterogeneous computing resources: mainframes, PC clusters and workstations running multiphysics codes and utility software, e.g., visualization tools. It is based on concepts defined by the HEAVEN consortium, a European scientific consortium that includes industrial partners from the aerospace, telecommunication and software industries as well as academic research institutes. The HEAVEN consortium is currently working on a project that aims to create advanced service platforms enabling "virtual private grids", which support various environments for users through a suitable high-level interface. This will become the basis for future generalized services allowing the integration of various services without the need to deploy dedicated grid infrastructures.

    An infrastructure service recommendation system for cloud applications with real-time QoS requirement constraints

    The proliferation of cloud computing has revolutionized the hosting and delivery of Internet-based application services. However, with the constant launch of new cloud services and capabilities almost every month by both big (e.g., Amazon Web Service and Microsoft Azure) and small companies (e.g., Rackspace and Ninefold), decision makers (e.g., application developers and chief information officers) are likely to be overwhelmed by the choices available. The decision-making problem is further complicated by heterogeneous service configurations and application provisioning QoS constraints. To address this challenge, in our previous work, we developed a semiautomated, extensible, and ontology-based approach to infrastructure service discovery and selection based only on design-time constraints (e.g., the renting cost, the data center location, the service feature, etc.). In this paper, we extend our approach to include real-time (run-time) QoS (the end-to-end message latency and the end-to-end message throughput) in the decision-making process. The hosting of next-generation applications in the domains of online interactive gaming, large-scale sensor analytics, and real-time mobile applications on cloud services necessitates the optimization of such real-time QoS constraints to meet service-level agreements. To this end, we present a real-time QoS-aware multicriteria decision-making technique that builds on the well-known analytic hierarchy process (AHP) method. The proposed technique is applicable to selecting Infrastructure as a Service (IaaS) cloud offers, and it allows users to define multiple design-time and real-time QoS constraints or requirements. These requirements are then matched against our knowledge base to compute the best-fit combinations of cloud services at the IaaS layer. We conducted extensive experiments to demonstrate the feasibility of our approach.
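    As a rough illustration of the AHP-based ranking step (a minimal sketch; the criteria, pairwise judgements and candidate IaaS offers below are invented for illustration and are not the paper's knowledge base or ontology):

        # Sketch: ranking IaaS offers with an AHP-style weighted score.
        # Criteria, pairwise judgements and offer data are illustrative assumptions.
        import math

        # Pairwise comparison matrix over criteria (cost, latency, throughput) on Saaty's 1-9 scale,
        # e.g. latency is judged 3x as important as cost here (an assumed judgement).
        criteria = ["cost", "latency", "throughput"]
        pairwise = [
            [1.0, 1/3, 1/2],   # cost vs. (cost, latency, throughput)
            [3.0, 1.0, 2.0],   # latency vs. ...
            [2.0, 1/2, 1.0],   # throughput vs. ...
        ]

        # Approximate AHP priority weights by the normalized geometric mean of each row.
        geo_means = [math.prod(row) ** (1.0 / len(row)) for row in pairwise]
        weights = [g / sum(geo_means) for g in geo_means]

        # Hypothetical IaaS offers: (monthly cost in USD, end-to-end latency in ms, throughput in msg/s).
        offers = {
            "offer_A": (300.0, 40.0, 900.0),
            "offer_B": (220.0, 85.0, 700.0),
            "offer_C": (410.0, 25.0, 1200.0),
        }

        def score(cost, latency, throughput):
            # Cost and latency are "lower is better", throughput is "higher is better";
            # normalize each value against the best value among the offers.
            best_cost = min(o[0] for o in offers.values())
            best_lat = min(o[1] for o in offers.values())
            best_thr = max(o[2] for o in offers.values())
            norms = [best_cost / cost, best_lat / latency, throughput / best_thr]
            return sum(w * n for w, n in zip(weights, norms))

        for name, (c, l, t) in sorted(offers.items(), key=lambda kv: score(*kv[1]), reverse=True):
            print(f"{name}: score={score(c, l, t):.3f}")

    The geometric-mean weighting is a common approximation of the AHP principal-eigenvector weights; the actual technique in the paper additionally matches requirements against an ontology-backed knowledge base.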

    A cumulus project: design and implementation

    Cloud computing has emerged as an innovative computing paradigm that aims to provide reliable, customized and QoS-guaranteed computing infrastructures for users. This paper presents our early experience with Cloud computing in the Cumulus project for compute centers. We introduce the various aspects of the Cumulus project, such as its design pattern, infrastructure, and middleware.

    Cloud Computing: A Perspective Study

    Cloud computing is emerging as a new computing paradigm that aims to provide reliable, customized and QoS-guaranteed dynamic computing environments for end users. In this paper, we study the Cloud computing paradigm from various aspects, such as definitions, distinct features, and enabling technologies. The paper gives an introductory review of Cloud computing and presents the state of the art of Cloud computing technologies.